An Improved DCC Model Based on Large-Dimensional Covariance Matrices Estimation and Its Applications

Authors

Abstract

The covariance matrix estimation plays an important role in portfolio optimization and risk management. It is well known that portfolio optimization is essentially a convex quadratic programming problem, which is also a special case of symmetric cone optimization. An accurate covariance matrix estimation will lead to a more reasonable asset weight allocation. However, some existing methods do not consider the influence of the time-varying factor on the estimations. To remedy this, in this article we propose an improved dynamic conditional correlation (DCC) model by using nonconvex estimation under the smoothly clipped absolute deviation (SCAD) and hard-threshold penalty functions. We first construct the estimator to obtain the optimal estimation, and then use it to replace the unconditional covariance matrix in the DCC model. Numerical experiments show that the loss of the proposed estimator is smaller than that of other variants. Finally, we apply our model to the classic Markowitz portfolio. The results show that it performs better than current models.
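The thresholding step of such an approach can be illustrated with a minimal sketch (not the authors' code): both penalties are applied elementwise to the off-diagonal entries of a sample covariance matrix. The value a = 3.7 follows Fan and Li's standard recommendation for SCAD; the function names and the choice to leave the diagonal untouched are assumptions for illustration.

```python
import numpy as np

def hard_threshold(S, lam):
    """Zero out entries of S below lam in magnitude (hard-threshold penalty)."""
    T = np.where(np.abs(S) >= lam, S, 0.0)
    np.fill_diagonal(T, np.diag(S))  # keep variances untouched
    return T

def scad_threshold(S, lam, a=3.7):
    """Elementwise SCAD thresholding (a = 3.7 is Fan & Li's recommendation)."""
    x = S.copy()
    absx = np.abs(x)
    out = np.zeros_like(x)
    # |x| <= 2*lam: behaves like soft thresholding
    m1 = absx <= 2 * lam
    out[m1] = np.sign(x[m1]) * np.maximum(absx[m1] - lam, 0.0)
    # 2*lam < |x| <= a*lam: linear interpolation region
    m2 = (absx > 2 * lam) & (absx <= a * lam)
    out[m2] = ((a - 1) * x[m2] - np.sign(x[m2]) * a * lam) / (a - 2)
    # |x| > a*lam: large entries are left unchanged (no bias)
    m3 = absx > a * lam
    out[m3] = x[m3]
    np.fill_diagonal(out, np.diag(S))  # keep variances untouched
    return out
```

Unlike soft thresholding, SCAD leaves large entries unshrunk, which reduces estimation bias for strong correlations while still zeroing out small, noisy ones.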


Related references

Estimation of Large Covariance Matrices

This paper considers estimating a covariance matrix of p variables from n observations by either banding or tapering the sample covariance matrix, or estimating a banded version of the inverse of the covariance. We show that these estimates are consistent in the operator norm as long as (log p)/n → 0, and obtain explicit rates. The results are uniform over some fairly natural well-conditioned fam...
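The banding and tapering operations described above can be sketched as follows. This is a simplified illustration only: the linear taper weights used here are one of several schemes considered in the literature, and the function names are made up.

```python
import numpy as np

def band_covariance(S, k):
    """Banding: keep only entries within k of the diagonal, zero the rest."""
    p = S.shape[0]
    i, j = np.indices((p, p))
    return np.where(np.abs(i - j) <= k, S, 0.0)

def taper_covariance(S, k):
    """Tapering: smoothly downweight entries by distance from the diagonal."""
    p = S.shape[0]
    i, j = np.indices((p, p))
    w = np.clip(1.0 - np.abs(i - j) / (k + 1), 0.0, 1.0)  # linear taper
    return S * w
```

Both exploit an assumed ordering of the variables in which dependence decays with distance from the diagonal, which is why they can beat the raw sample covariance when p grows with n.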


Operator norm consistent estimation of large dimensional sparse covariance matrices

Estimating covariance matrices is a problem of fundamental importance in multivariate statistics. In practice it is increasingly frequent to work with data matrices X of dimension n × p, where p and n are both large. Results from random matrix theory show very clearly that in this setting, standard estimators like the sample covariance matrix perform in general very poorly. In this “large n, la...
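A quick numerical illustration of this degradation (a sketch under an assumed identity population covariance): even though every true eigenvalue equals 1, the eigenvalues of the sample covariance matrix scatter over a wide interval once p is comparable to n.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 80                     # sample size comparable to dimension
X = rng.standard_normal((n, p))    # population covariance is the identity
S = (X.T @ X) / n                  # sample covariance matrix
eigs = np.linalg.eigvalsh(S)
# All population eigenvalues are 1, yet the sample eigenvalues spread
# roughly over the Marchenko-Pastur support
# [(1 - sqrt(p/n))^2, (1 + sqrt(p/n))^2], here about [0.01, 3.6].
print(eigs.min(), eigs.max())
```

This spreading is exactly the random-matrix-theory effect the snippet refers to, and it is why plug-in uses of the sample covariance (e.g. in portfolio weights, which involve its inverse) behave so poorly in this regime.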


Direct Nonlinear Shrinkage Estimation of Large-Dimensional Covariance Matrices

This paper introduces a nonlinear shrinkage estimator of the covariance matrix that does not require recovering the population eigenvalues first. We estimate the sample spectral density and its Hilbert transform directly by smoothing the sample eigenvalues with a variable-bandwidth kernel. Relative to numerically inverting the so-called QuEST function, the main advantages of direct kernel estim...



Nonlinear shrinkage estimation of large-dimensional covariance matrices

Many statistical applications require an estimate of a covariance matrix and/or its inverse. When the matrix dimension is large compared to the sample size, which happens frequently, the sample covariance matrix is known to perform poorly and may suffer from ill-conditioning. There already exists an extensive literature concerning improved estimators in such situations. In the absence of further kn...



Journal

Journal title: Symmetry

Year: 2023

ISSN: 0865-4824, 2226-1877

DOI: https://doi.org/10.3390/sym15040953